
    Study of Multiferroic Properties of Ferroelectric-Ferromagnetic Heterostructures BZT-BCT/LSMO

    In recent years, there has been a flurry of research on multiferroic materials due to their potential applications. Lead (Pb)-based ferroelectric and multiferroic materials (PZT, PMN-PT, PZN-PT, etc.) have been widely used for sensors, actuators, and electro-mechanical applications because of their excellent dielectric and piezoelectric properties. However, these materials face global restrictions due to the toxicity of Pb. In this thesis, the multiferroic properties of ferroelectric-ferromagnetic heterostructures consisting of the Pb-free perovskite oxides 0.5Ba(Zr0.2Ti0.8)O3-0.5(Ba0.7Ca0.3)TiO3 (BZT-BCT) and La0.7Sr0.3MnO3 (LSMO) have been studied. The BZT-BCT/LSMO heterostructures were fabricated on LaAlO3 (LAO) and Pt substrates by pulsed laser deposition. The structural and crystalline quality of the films was investigated through theta-2theta scans, rocking curves, and phi-scans in X-ray diffraction (XRD), as well as Raman spectroscopy. Ferroelectric and ferromagnetic properties were characterized using the Sawyer-Tower method, a SQUID magnetometer, and ferromagnetic resonance (FMR) spectroscopy. Well-behaved magnetization versus magnetic field (M-H) hysteresis loops were observed in the LSMO films as well as in the heterostructures, indicating ferromagnetism in the films. The FMR spectroscopy data support the static magnetization data obtained with the SQUID. These results may guide the development of next-generation lead-free ferroelectric-ferromagnetic heterostructures for magnetoelectric device applications.
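
    For context, a minimal sketch of how polarization is typically extracted in a Sawyer-Tower measurement; this is the standard textbook relation with generic symbols, not material reproduced from the thesis.

```latex
% Standard Sawyer-Tower relation (illustrative; symbols are generic, not from the thesis):
% the voltage V_ref across a known series capacitor C_ref measures the switched charge,
% which divided by the electrode area A gives the polarization plotted against the field E.
\begin{align}
  P &= \frac{Q}{A} = \frac{C_{\mathrm{ref}}\, V_{\mathrm{ref}}}{A}, &
  E &= \frac{V_{\mathrm{applied}} - V_{\mathrm{ref}}}{d},
\end{align}
% where d is the ferroelectric film thickness and V_ref << V_applied in practice.
```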

    A Machine Learning Framework for Identifying Molecular Biomarkers from Transcriptomic Cancer Data

    Cancer is a complex molecular process driven by abnormal changes in the genome, such as mutations and copy number variations, and by epigenetic aberrations such as dysregulation of long non-coding RNAs (lncRNAs). These abnormal changes are reflected in the transcriptome by turning oncogenes on and tumor suppressor genes off, which are considered cancer biomarkers. However, transcriptomic data is high dimensional, and finding the best subset of genes (features) related to cancer is computationally challenging and expensive. Thus, developing a feature selection framework to discover molecular biomarkers for cancer is critical. Traditional approaches to biomarker discovery calculate the fold change for each gene by comparing expression profiles between tumor and healthy samples, and thus fail to capture the combined effect of the whole gene set. Also, these approaches do not always investigate cancer-type prediction capabilities using the discovered biomarkers. In this work, we proposed a machine learning-based framework to address all of the above challenges in discovering lncRNA biomarkers. First, we developed a machine learning pipeline that takes lncRNA expression profiles of cancer samples as input and outputs a small set of key lncRNAs that can accurately predict multiple cancer types. A significant innovation of our work is its ability to identify biomarkers without using healthy samples. However, this initial framework cannot identify cancer-specific lncRNAs. Second, we extended our framework to identify cancer type- and subtype-specific lncRNAs. Third, we proposed to use a state-of-the-art deep learning algorithm, the concrete autoencoder (CAE), in an unsupervised setting, which efficiently identifies a subset of the most informative features. However, the CAE does not identify reproducible features across different runs due to its stochastic nature. To address this issue, we proposed a multi-run CAE (mrCAE) that identifies a stable set of features. Our deep learning-based pipeline significantly extends previous state-of-the-art feature selection techniques. Finally, we showed that the discovered biomarkers are biologically relevant through a literature review and prognostically significant through survival analyses. The discovered novel biomarkers could be used as a screening tool for the diagnosis of different cancers and as therapeutic targets.
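
    To illustrate the concrete-selection idea behind the CAE mentioned above, here is a minimal, hypothetical PyTorch sketch of a Gumbel-softmax feature-selection layer. The class name, hyperparameters, and usage are assumptions for illustration only and are not the authors' implementation.

```python
# Hypothetical sketch of a concrete (Gumbel-softmax) feature-selection layer,
# in the spirit of a concrete autoencoder's encoder. Names and defaults are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConcreteSelector(nn.Module):
    def __init__(self, n_features: int, k: int, temperature: float = 0.5):
        super().__init__()
        # One logit vector per selected feature; softmax runs over the input features.
        self.logits = nn.Parameter(torch.randn(k, n_features) * 0.01)
        self.temperature = temperature

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Relaxed one-hot selection weights, shape (k, n_features).
        weights = F.gumbel_softmax(self.logits, tau=self.temperature, hard=False, dim=-1)
        # Each output column is a (soft) copy of one input feature (lncRNA).
        return x @ weights.t()

    def selected_features(self) -> torch.Tensor:
        # After training, the arg-max index of each selector gives the chosen feature.
        return self.logits.argmax(dim=-1)

# Usage sketch: select k informative features from an expression matrix.
x = torch.rand(8, 1000)              # 8 samples, 1000 lncRNA expression values (dummy data)
selector = ConcreteSelector(n_features=1000, k=16)
compressed = selector(x)             # shape (8, 16); feed this into a decoder during training
print(selector.selected_features())  # indices of the currently preferred features
```

    Running the selector several times with different seeds and keeping only features that recur across runs is one way to picture the stability idea behind the multi-run CAE, though the actual mrCAE procedure is described in the thesis itself.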

    Thin Film Studies Toward Improving the Performance of Accelerator Electron Sources

    Future electron accelerators require DC high voltage photoguns operating beyond the present state of the art to conduct new experiments that require ultra-bright electron beams with high average current and higher bunch charge. To meet these demands, the accelerators must demonstrate improvements in a number of photogun areas, including vacuum, the elimination of field emission from high voltage electrodes, and photocathodes. This dissertation illustrates how these improvements can be achieved by applying suitable thin films to the photogun structure for producing ultra-bright electron beams. The work is composed of three complementary studies. First, the outgassing rates of three nominally identical 304L stainless steel vacuum chambers were studied to determine the effects of chamber coatings (silicon and titanium nitride) and heat treatments. For an uncoated stainless steel chamber, the diffusion-limited outgassing was taken over by the recombination-limited process as soon as a low outgassing rate of ~1.79(+/-0.05) x 10^-13 Torr L s^-1 cm^-2 was achieved. An amorphous silicon coating on the stainless steel chambers exhibited recombination-limited behavior, and any heat treatment became ineffective in reducing the outgassing rate. A TiN-coated chamber yielded the smallest apparent outgassing rate of all the chambers: 6.44(+/-0.05) x 10^-13 Torr L s^-1 cm^-2 following an initial 90 °C bake and 2(+/-20) x 10^-16 Torr L s^-1 cm^-2 following the final bake in the series. This perceived low outgassing rate was attributed to the modest pumping provided by the TiN coating itself. Second, the high voltage performance of three TiN-coated aluminum electrodes, before and after gas conditioning with helium, was compared to that of bare aluminum electrodes and electrodes manufactured from titanium alloy (Ti-6Al-4V). This study suggests that aluminum electrodes coated with TiN could simplify the task of implementing photocathode cooling, which is required for future high current electron beam applications. The best performing TiN-coated aluminum electrode demonstrated less than 15 pA of field emission current at -175 kV for a 10 mm cathode/anode gap, which corresponds to a field strength of 22.5 MV/m. Third, the effect of antimony thickness on the performance of bialkali-antimonide photocathodes was studied. The high-capacity effusion source enabled us to manufacture photocathodes having a maximum QE around 10% and an extended low voltage 1/e lifetime (> 90 days) at 532 nm via the co-deposition method, with relatively thick layers of antimony (≥ 300 nm). We speculate that alkali co-deposition provides optimized stoichiometry for photocathodes manufactured using thick Sb layers, which could serve as a reservoir for the alkali. In summary, this research examined the effectiveness of thin films applied to photogun chamber components to achieve extremely high vacuum, to eliminate high voltage induced field emission from electrodes, and to generate photocurrent with high quantum yield and an extended operational lifetime. Simultaneous implementation of these findings can meet the challenges of future ultra-bright photoguns.
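
    For reference, the quantum efficiency quoted above is conventionally obtained from the measured photocurrent and drive laser power; the standard textbook relation (generic symbols, not reproduced from the dissertation) is:

```latex
% Standard photocathode quantum-efficiency relation (illustrative, generic symbols):
% ratio of emitted electrons per second to incident photons per second.
\begin{equation}
  \mathrm{QE} = \frac{I_{\mathrm{photo}}/e}{P_{\mathrm{laser}}\,\lambda/(hc)}
             = \frac{hc}{e}\,\frac{I_{\mathrm{photo}}}{P_{\mathrm{laser}}\,\lambda}
             \approx \frac{1239.8\ \mathrm{W\,nm/A}\;\; I_{\mathrm{photo}}}{\lambda\,[\mathrm{nm}]\; P_{\mathrm{laser}}\,[\mathrm{W}]},
\end{equation}
% so at 532 nm, a QE of about 10% corresponds to roughly 43 mA of photocurrent per watt of laser power.
```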

    Engineering the Hardware/Software Interface for Robotic Platforms - A Comparison of Applied Model Checking with Prolog and Alloy

    Robotic platforms serve different use cases, ranging from experiments for prototyping assistive applications up to embedded systems realizing cyber-physical systems in various domains. We are using 1:10 scale miniature vehicles as a robotic platform to conduct research in the domain of self-driving cars and collaborative vehicle fleets. Thus, experiments with different sensors such as ultrasonic sensors, infrared sensors, and rotary encoders need to be prepared and realized using our vehicle platform. For each setup, we need to configure the hardware/software interface board to handle all sensors and actuators. Therefore, we need to find a specific configuration setting for each pin of the interface board that can handle our current hardware setup but is also flexible enough to support further sensors or actuators for future use cases. In this paper, we show how to model the configuration space of a hardware/software interface board so that model checking can be used to solve the tasks of finding any, all, and the best possible pin configuration. We present results from a formal experiment applying the declarative languages Alloy and Prolog to guide the process of engineering the hardware/software interface for robotic platforms, using as an example a configuration complexity of up to ten pins, resulting in a configuration space of more than 14.5 million possibilities. Our results show that our domain model in Alloy performs better than Prolog at finding feasible solutions for larger configurations, with an average time of 0.58 s. For finding the best solution, our Prolog model performs better, taking only 1.38 s for the largest desired configuration; however, this important use case is currently not covered by the existing tools for the hardware used as an example in this article. (Presented at DSLRob 2013; arXiv:cs/1312.5952)
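
    To make the search problem concrete, here is a small, hypothetical Python sketch of the brute-force view of the task: enumerate candidate pin configurations, filter the feasible ones against the required sensor/actuator set, and pick a best one by a score. The pin modes, requirements, and scoring function are invented for illustration and do not reflect the Alloy or Prolog models from the paper.

```python
# Hypothetical sketch: exhaustive search over pin configurations.
# Pin modes, requirements, and the scoring function are illustrative only.
from itertools import product

PIN_MODES = ["digital_in", "digital_out", "analog_in", "pwm", "unused"]
NUM_PINS = 4  # the paper scales this kind of problem up to ten pins

# Example requirement: how many pins of each mode the current sensor/actuator
# setup needs (e.g., ultrasonic sensors, infrared sensors, rotary encoders).
REQUIRED = {"digital_in": 1, "analog_in": 1, "pwm": 1}

def feasible(config):
    """A configuration is feasible if it provides at least the required pins."""
    return all(config.count(mode) >= n for mode, n in REQUIRED.items())

def score(config):
    """Prefer configurations that keep pins unused, i.e. free for future sensors."""
    return config.count("unused")

# Find any, all, and the best feasible configuration.
all_feasible = [c for c in product(PIN_MODES, repeat=NUM_PINS) if feasible(c)]
any_feasible = all_feasible[0] if all_feasible else None
best_feasible = max(all_feasible, key=score, default=None)

print(f"{len(all_feasible)} feasible configurations out of {len(PIN_MODES) ** NUM_PINS}")
print("any: ", any_feasible)
print("best:", best_feasible)
```

    Declarative approaches such as Alloy and Prolog avoid writing this enumeration by hand: the constraints are stated once and the solver searches the configuration space, which is what makes the comparison in the paper interesting at ten pins and millions of candidates.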

    A modified laboratory approach to determine reaeration rate for river water

    It has been reported that reaeration rates determined from laboratory investigations may not be well suited for predicting the reaeration rate of natural streams. The sampling method used during reaeration experiments is a potential source of error in the laboratory estimation of the reaeration rate coefficient for river waters, and this issue is addressed in this research. A modified method based on the sampling procedure in a flume was adopted to develop a reaeration rate equation for the Pusu River in Malaysia, which is demographically a very important river. An important feature, the presence of several culverts along the course of the river, was also considered in modeling the dissolved oxygen (DO) concentration. DO was calibrated and validated using the Water Quality Analysis Simulation Program (WASP) with appropriate kinetic rate coefficients for the Pusu River. The performance of the new reaeration rate equation and other process equations on the calibration and validation data was assessed in terms of root-mean-square error (RMSE), mean error between observed and predicted data, and the R2 value. The study results revealed that the equation developed in this research, which considers the impact of culverts on the reaeration rate, predicted DO in the Pusu River with improved accuracy compared to the other equations. RMSEs were found to be 0.083 and 0.067 mg/L for the calibration and validation data, respectively. Mean errors between observed and model-predicted data were 0.06 and 0.05 mg/L for calibration and validation, respectively. The R2 value was 0.99 in both cases. These results facilitate accuracy in future studies on DO in the Pusu River.
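
    For clarity, the error metrics quoted above follow their usual definitions over the n observed/predicted DO pairs; these are the standard formulas, not reproduced from the paper, and the mean error is written here as the signed mean difference (the paper may use the absolute value).

```latex
% Standard definitions of the reported error metrics (illustrative):
\begin{align}
  \mathrm{RMSE} &= \sqrt{\frac{1}{n}\sum_{i=1}^{n}\bigl(\mathrm{DO}^{\mathrm{obs}}_{i} - \mathrm{DO}^{\mathrm{pred}}_{i}\bigr)^{2}}, &
  \mathrm{ME} &= \frac{1}{n}\sum_{i=1}^{n}\bigl(\mathrm{DO}^{\mathrm{obs}}_{i} - \mathrm{DO}^{\mathrm{pred}}_{i}\bigr).
\end{align}
```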

    Map out the best BI tool and SPC software for a case company: selection and rationalization process

    Business Intelligence (BI) is about delivering relevant and reliable information to the right people at the right time so that they can make better decisions faster. Doing this requires methods and programs to collect unstructured data, convert it into information, and present it in a way that improves business decisions; many BI tools available on the market perform these tasks. Statistical Process Control (SPC), on the other hand, uses statistical tools and techniques to control a process: it helps monitor and control the process, reduce process variation, improve product quality, and so on. The main goal of this thesis is to find the best BI tool and SPC software for Valmet Automotive, a Finland-based automobile manufacturing company and one of the world's top producers of automobiles and batteries. Our organization uses BI and SPC tools for data analysis, visualization, and processing; nevertheless, we have faced challenges in employing these tools at various times. I started this thesis to identify suitable BI and SPC technologies for our organization so that we can solve this problem. Every candidate must satisfy ISO 9000-3, an international guideline for software development and quality assurance, and ISO 9126, a quality model for evaluating software quality; my software selection process is based on these two standards. My first goal was to find the best BI tools and SPC software for our organization based on its specific needs. To map out potential BI and SPC tools, I picked 30 BI tools and 13 SPC software packages from the large number of similar products on the market. All of the map-out criteria and software names were organized in Excel, and I rated each tool on a five-point scale for each criterion and computed a total weighted score per tool. After completing the whole procedure in Excel, I repeated the calculation in code, using the Python programming language to compute the overall scores. Using Python visualization libraries, I then identified the top five BI tools out of thirty and the top three SPC tools out of thirteen. In the case study, I also used these visualizations to find the best BI tool and SPC software for our company. In the findings, Microsoft Power BI was the best BI tool and SPC for Excel was the best SPC software for our case company; accordingly, our case company, Valmet Automotive, chose Microsoft Power BI as its leading BI solution. In addition, the research involved a comprehensive statistical analysis, including descriptive statistics, correlation analysis, hypothesis testing, and regression analysis, to examine the selection process of SPC and BI tools in the case company. The methods and calculations were carried out for our case company's tool selection; however, in the future, any company that needs good quality BI tools or SPC software can follow this methodology and these calculation methods to select the best BI tools and SPC software available on the market at that time for their needs.
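
    A minimal, hypothetical sketch of the weighted-scoring step described above; the criteria names, weights, ratings, and tool names are invented placeholders, not the thesis's actual Excel/Python workbook.

```python
# Hypothetical weighted-scoring sketch for ranking candidate tools.
# Criteria, weights, ratings, and tool names are illustrative placeholders.

# Importance weight of each selection criterion (weights sum to 1.0).
weights = {"usability": 0.30, "integration": 0.25, "cost": 0.20, "support": 0.25}

# Ratings on a five-point scale for each candidate tool against each criterion.
ratings = {
    "Tool A": {"usability": 5, "integration": 4, "cost": 3, "support": 4},
    "Tool B": {"usability": 4, "integration": 5, "cost": 4, "support": 3},
    "Tool C": {"usability": 3, "integration": 3, "cost": 5, "support": 5},
}

def weighted_score(tool_ratings: dict) -> float:
    """Total weighted score: sum of rating x weight over all criteria."""
    return sum(weights[criterion] * rating for criterion, rating in tool_ratings.items())

# Rank tools from best to worst by total weighted score.
ranking = sorted(ratings, key=lambda tool: weighted_score(ratings[tool]), reverse=True)
for tool in ranking:
    print(f"{tool}: {weighted_score(ratings[tool]):.2f}")
```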

    Improvement of existing water quality index in Selangor, Malaysia.

    A revised and improved water quality index (WQI) is proposed for the State of Selangor in Malaysia. Analyses were conducted to develop the rating curves and to find the new sub-indices and weighting factors for the selected parameters. The water quality parameters were grouped according to their similarities. The class of water obtained at station 1K07 (Klang River) using the new WQI equation and the existing WQI equation was Class IV and Class III, respectively. However, the status of the water was the same in both cases: both classified the water as polluted. In another case, for example at station 1L02 (Langat River), the class resulting from the new and existing WQI was the same, but the status of the water was different: the new WQI indicated that the water was clean, while the existing WQI indicated that it was slightly polluted. The proposed WQI appears to be slightly stricter than the existing one, which is expected to benefit the river environment. The outcomes of this research can be applied to protect the rivers in Selangor.
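
    As background, indices of this kind are typically computed as a weighted sum of parameter sub-indices; the general form is shown below with generic symbols, not the specific weights or rating curves derived in this study.

```latex
% General weighted-sub-index form of a water quality index (illustrative):
% SI_i is the sub-index obtained from the rating curve of parameter i,
% and w_i is its weighting factor, with the weights summing to one.
\begin{equation}
  \mathrm{WQI} = \sum_{i=1}^{n} w_i \, \mathrm{SI}_i,
  \qquad \sum_{i=1}^{n} w_i = 1 .
\end{equation}
```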

    Research progress in bioflocculants from bacteria

    Although the water and wastewater treatment industries are among the major users of flocculants, flocculants are also used in various food industries. Chemical flocculants are widely preferred in these industries due to their low production cost and the speed with which they can be produced. However, the negative effects of chemical flocculants should not be neglected for the sake of economic benefits alone. Therefore, researchers are working to discover efficient and economical flocculants from biological sources. Several attempts have been made, and are still being made, to extract or produce bioflocculants from natural sources such as plants, bacteria, fungi, yeast, and algae. This review reveals that a significant amount of work has been done in the past in search of bioflocculants; however, commercially viable bioflocculants are yet to be marketed widely. With the advent of new biotechnologies and advances in genetic engineering, researchers are hopeful of discovering or developing commercially viable, safe, and environment-friendly bioflocculants.